# Hierarchical Transformer
## Hiera Base 224

Hiera is an efficient hierarchical Transformer architecture that learns spatial biases through MAE pretraining rather than through specialized vision-specific modules, significantly improving parameter efficiency and speed.
- Task: Image Classification
- Library: Transformers
- Author: namangarg110
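Since the card lists the Transformers library, a minimal usage sketch follows. The checkpoint id `namangarg110/hiera_base_224` is an assumption inferred from the card's model name and author, and `trust_remote_code=True` is included on the assumption that the repository may ship custom modeling code.

```python
# Minimal sketch: image classification with the Hiera checkpoint.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="namangarg110/hiera_base_224",  # assumed checkpoint id (inferred from the card)
    trust_remote_code=True,               # assumption: repo may ship custom modeling code
)

# Accepts a local path, URL, or PIL image.
predictions = classifier("cat.jpg")
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```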
## MVP Multi-Task

- License: Apache-2.0

The MVP multi-task model is a prompt-based model pre-trained on a mixture of labeled datasets, designed for a wide range of natural language generation tasks.
- Task: Large Language Model
- Library: Transformers (supports multiple languages)
- Author: RUCAIBox
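MVP has native support in the Transformers library; the sketch below assumes the checkpoint id `RUCAIBox/mvp`, inferred from the author handle on this card.

```python
# Minimal sketch: prompt-based generation with MVP.
from transformers import MvpForConditionalGeneration, MvpTokenizer

tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")  # assumed checkpoint id
model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp")

# MVP is prompt-based: the task is stated in natural language, e.g. "Summarize:".
inputs = tokenizer(
    "Summarize: MVP is pre-trained on a mixture of labeled datasets "
    "for natural language generation tasks.",
    return_tensors="pt",
)
generated = model.generate(**inputs, max_length=50)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```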